If two variables are independent, their covariance and correlation are zero, but the converse is not true. Why?



Explanation using an example


Let X be a random variable that is −1 or +1 with probability 0.5 each. Then let Y be a random variable such that Y = 0 if X = −1, and Y is −1 or +1 with probability 0.5 each if X = 1.

Clearly X and Y are highly dependent (knowing Y determines X exactly), but their covariance is zero: they both have zero mean, and


E[XY] = (−1) ⋅ 0 ⋅ P(X=−1)
      + 1 ⋅ 1 ⋅ P(X=1, Y=1)
      + 1 ⋅ (−1) ⋅ P(X=1, Y=−1)
      = 0 + 0.25 − 0.25
      = 0

so Cov(X, Y) = E[XY] − E[X]E[Y] = 0.
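A quick simulation makes this concrete. This is a minimal sketch in Python with NumPy (the seed and sample size are arbitrary choices, not from the source): the sample covariance comes out near zero even though Y pins down X exactly.

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed
n = 1_000_000

# X is -1 or +1 with probability 0.5 each.
x = rng.choice([-1, 1], size=n)

# Y is 0 when X = -1, and -1 or +1 (equally likely) when X = +1.
y = np.where(x == -1, 0, rng.choice([-1, 1], size=n))

# Sample covariance: should be ~0.
print("cov(X, Y) ≈", np.cov(x, y)[0, 1])

# Yet Y determines X exactly: Y = 0 forces X = -1, Y != 0 forces X = +1.
print("P(X = -1 | Y = 0)  ≈", np.mean(x[y == 0] == -1))  # ~1.0
print("P(X = +1 | Y != 0) ≈", np.mean(x[y != 0] == 1))   # ~1.0
```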


Or more generally, take any distribution P(X) and any conditional P(Y|X) such that P(Y=a|X) = P(Y=−a|X) for all X (i.e., a joint distribution that is symmetric about the x axis), and you will always have zero covariance. But the variables are not independent whenever P(Y|X) ≠ P(Y), i.e., whenever the conditionals are not all equal to the marginal. The same holds for symmetry about the y axis, with the roles of X and Y swapped.
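This general construction is also easy to check numerically. In the sketch below (my own illustration, not from the source), P(X) is an arbitrarily chosen exponential distribution and Y = ±X with equal probability, which satisfies P(Y=a|X) = P(Y=−a|X) for every X:

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed
n = 1_000_000

# Any P(X): here a standard exponential, chosen arbitrarily.
x = rng.exponential(size=n)

# P(Y|X) symmetric about the x axis: Y = s * X with s = ±1 equally likely,
# so P(Y = a | X) = P(Y = -a | X) for all X.
s = rng.choice([-1, 1], size=n)
y = s * x

# Zero covariance by symmetry...
print("cov(X, Y) ≈", np.cov(x, y)[0, 1])

# ...but not independence: |Y| = X, so the conditional P(|Y| > 1 | X > 1)
# is exactly 1 while the marginal P(|Y| > 1) is about e^-1 ≈ 0.37.
print("P(|Y| > 1)         ≈", np.mean(np.abs(y) > 1))
print("P(|Y| > 1 | X > 1) ≈", np.mean(np.abs(y[x > 1]) > 1))
```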



Source: https://stats.stackexchange.com/questions/12842/covariance-and-independence
